🇬🇧 en fi 🇫🇮

Markov process noun

  • (probability theory) Any stochastic process for which the conditional probability distribution of future states depends only on the current state (and not on past states).
Markov-prosessi, Markovin prosessi
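The defining (Markov) property — that the next state is sampled from a distribution conditioned only on the current state — can be illustrated with a small simulation. This is a minimal sketch; the state names and transition probabilities below are purely illustrative, not part of the definition:

```python
import random

# Illustrative two-state Markov chain. The transition probabilities
# depend only on the current state, never on earlier history.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state given ONLY the current state (Markov property)."""
    probs = TRANSITIONS[current]
    states = list(probs)
    weights = [probs[s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Generate a sample path of the chain; each step looks only one state back."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1], rng))
    return path
```

Note that `next_state` receives no history argument at all — the chain's memorylessness is built into its signature.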